Importance Weighted Variational Inference without the Reparameterization Trick

Daudel, Kamélia, Tran, Minh-Ngoc, Zhang, Cheng

arXiv.org Machine Learning

Importance weighted variational inference (VI) approximates densities known up to a normalizing constant by optimizing bounds that tighten with the number of Monte Carlo samples $N$. Standard optimization relies on reparameterized gradient estimators, which are well-studied theoretically yet restrict both the choice of the data-generating process and the variational approximation. While REINFORCE gradient estimators do not suffer from such restrictions, they lack rigorous theoretical justification. In this paper, we provide the first comprehensive analysis of REINFORCE gradient estimators in importance weighted VI, leveraging this theoretical foundation to diagnose and resolve fundamental deficiencies in current state-of-the-art estimators. Specifically, we introduce and examine a generalized family of variational inference for Monte Carlo objectives (VIMCO) gradient estimators. We prove that state-of-the-art VIMCO gradient estimators exhibit a vanishing signal-to-noise ratio (SNR) as $N$ increases, which prevents effective optimization. To overcome this issue, we propose the novel VIMCO-$\star$ gradient estimator and show that it averts the SNR collapse of existing VIMCO gradient estimators by achieving a $\sqrt{N}$ SNR scaling instead. We demonstrate its superior empirical performance compared to current VIMCO implementations in challenging settings where reparameterized gradients are typically unavailable.
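As a concrete illustration of the estimators the abstract discusses (a minimal sketch of a REINFORCE estimator with VIMCO-style leave-one-out baselines, not the paper's VIMCO-$\star$ estimator), the following toy example estimates the gradient of the importance weighted bound for a Gaussian target. The target `log_p_tilde`, the variational family, and all function names are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p_tilde(z):
    # Unnormalized target density (standard normal up to a constant).
    return -0.5 * z ** 2

def log_q(z, mu):
    # Variational family: N(mu, 1).
    return -0.5 * (z - mu) ** 2 - 0.5 * np.log(2 * np.pi)

def dlogq_dmu(z, mu):
    # Score of q with respect to its mean parameter.
    return z - mu

def vimco_style_grad(mu, N, rng):
    """One-sample REINFORCE estimate of d/dmu of the importance weighted
    bound L_N = E[log (1/N) sum_i w_i], using VIMCO-style leave-one-out
    baselines to reduce the variance of the score-function term."""
    z = rng.normal(mu, 1.0, size=N)
    log_w = log_p_tilde(z) - log_q(z, mu)
    log_sum_w = np.logaddexp.reduce(log_w)
    L_hat = log_sum_w - np.log(N)          # log of (1/N) sum_i w_i
    score_term = 0.0
    for i in range(N):
        # Baseline for sample i: replace log w_i by the mean of the other
        # log-weights (log geometric mean), which is independent of z_i.
        lw = log_w.copy()
        lw[i] = np.delete(log_w, i).mean()
        b_i = np.logaddexp.reduce(lw) - np.log(N)
        score_term += (L_hat - b_i) * dlogq_dmu(z[i], mu)
    # Direct dependence of the weights on mu: d log w_i / d mu = -d log q / d mu.
    w_norm = np.exp(log_w - log_sum_w)
    return score_term - np.sum(w_norm * dlogq_dmu(z, mu))

# Average many one-sample estimates at mu = 1; the target mean is 0,
# so the estimated gradient of the bound should come out negative.
g = np.mean([vimco_style_grad(1.0, 5, rng) for _ in range(4000)])
print(g)
```

Note that this estimator uses only density evaluations and scores of `q`, with no reparameterized sampling path, which is the regime the paper targets.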


Robust, Accurate Stochastic Optimization for Variational Inference

Neural Information Processing Systems

We examine the accuracy of black box variational posterior approximations for parametric models in a probabilistic programming context. The performance of these approximations depends on (1) how well the variational family approximates the true posterior distribution, (2) the choice of divergence, and (3) the optimization of the variational objective. We show that even when the true variational family is used, high-dimensional posteriors can be very poorly approximated using common stochastic gradient descent (SGD) optimizers. Motivated by recent theory, we propose a simple and parallel way to improve SGD estimates for variational inference. The approach is theoretically motivated and comes with a diagnostic for convergence and a novel stopping rule, which is robust to noisy objective function evaluations. We show empirically that the new workflow works well on a diverse set of models and datasets, and that it warns the user if the stochastic optimization fails or if the chosen variational distribution is a poor approximation.
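The "simple and parallel way to improve SGD estimates" alluded to above can be illustrated in the spirit of Polyak-Ruppert iterate averaging; the sketch below is our own toy example under that assumption, not the paper's algorithm. The last SGD iterate on a noisy objective rattles around in a noise ball, while the post-burn-in average of the iterates lands much closer to the optimum.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for a stochastic VI objective: minimize f(x) = 0.5 * ||x||^2,
# observing gradients corrupted by Gaussian noise.
def noisy_grad(x, rng):
    return x + rng.normal(0.0, 1.0, size=x.shape)

x = np.full(10, 5.0)        # initial iterate
lr = 0.05                   # fixed SGD step size
T, burn_in = 2000, 500
avg, count = np.zeros_like(x), 0

for t in range(T):
    x = x - lr * noisy_grad(x, rng)
    if t >= burn_in:        # average only after the transient phase
        count += 1
        avg += (x - avg) / count   # running average of the iterates

err_last = np.linalg.norm(x)      # last iterate: stuck in a noise ball
err_avg = np.linalg.norm(avg)     # averaged iterate: much closer to 0
print(err_last, err_avg)
```

Averaging is "parallel" in the sense that it can be applied independently per parameter (or per chain) on top of any SGD trajectory, and the gap between the last iterate and the running average can itself serve as a crude convergence diagnostic.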


Tokyo couple die in sauna fire after being trapped inside

BBC News

A husband and wife have died after being trapped in a private sauna room that caught fire in Japan on Monday. Tokyo police are investigating whether a faulty doorknob trapped the couple inside the room at Sauna Tiger, in the city's Akasaka district, local media have reported. Investigators also found that the facility's emergency alarm system was switched off, and allegedly had been for two years. "We offer our deepest condolences... and our heartfelt sympathies for the deep grief and pain that cannot be expressed in words," Sauna Tiger said in a statement on its website. The victims have been named by local media as Yoko Matsuda, a 37-year-old nail artist, and her husband Masanari, 36, who ran a beauty salon.